Training and Generating Neural Networks in Compressed Weight Space

12/31/2021 ∙ by Kazuki Irie, et al. ∙ IDSIA

The inputs and/or outputs of some neural nets are weight matrices of other neural nets. Indirect encodings or end-to-end compression of weight matrices could help to scale such approaches. Our goal is to open a discussion on this topic, starting with recurrent neural networks for character-level language modelling whose weight matrices are encoded by the discrete cosine transform. Our fast weight version thereof uses a recurrent neural network to parameterise the compressed weights. We present experimental results on the enwik8 dataset.
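The paper's abstract does not spell out the encoding mechanics, but the core idea of a DCT-based indirect encoding can be illustrated in a few lines. The following sketch (my own, not the authors' code; the matrix size and coefficient count are illustrative assumptions) parameterises a dense weight matrix by a small block of low-frequency DCT coefficients and reconstructs it with the inverse transform:

```python
# A minimal sketch of DCT-based indirect weight encoding, assuming
# NumPy/SciPy. Only the low-frequency coefficient block is trainable;
# the dense weight matrix is decoded from it on demand.
import numpy as np
from scipy.fft import idctn

def decode_weights(coeffs, shape):
    """Reconstruct a dense weight matrix from a small block of
    low-frequency DCT-II coefficients (the compressed parameters)."""
    full = np.zeros(shape)
    k0, k1 = coeffs.shape
    full[:k0, :k1] = coeffs           # coefficients occupy the low-frequency corner
    return idctn(full, norm="ortho")  # inverse 2D DCT yields the dense matrix

# Illustrative example: a 256x256 recurrent weight matrix parameterised
# by 32x32 = 1024 coefficients instead of 65536 direct weights
# (a 64x reduction in trainable parameters).
trainable = 0.01 * np.random.randn(32, 32)  # compressed parameters
W = decode_weights(trainable, (256, 256))   # dense weights used by the RNN
print(W.shape)  # (256, 256)
```

In practice the decoding would be implemented in a differentiable framework so that gradients flow back to the coefficients; in the fast weight variant described in the abstract, a recurrent network would emit the coefficient block itself, step by step, rather than treating it as a static parameter.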
